Margin-adaptive model selection in statistical learning

Author

  • Sylvain Arlot

Abstract

A classical condition for fast learning rates is the margin condition, first introduced by Mammen and Tsybakov. In this paper we tackle the problem of adaptivity to this condition in the context of model selection, in a general learning framework. In fact, we consider a weaker version of this condition, which takes into account that learning within a small model can be much easier than within a large one. Requiring this "strong margin adaptivity" makes the model selection problem more challenging. We first prove, in a general framework, that some penalization procedures (including local Rademacher complexities) exhibit this adaptivity when the models are nested. Contrary to previous results, this holds with penalties that depend only on the data. Our second main result is that strong margin adaptivity is not always possible when the models are not nested: for every model selection procedure (even a randomized one), there is a problem for which it fails to be strongly margin adaptive.

AMS 2000 subject classifications: Primary 68T05, 62H30; secondary 68Q32, 62G08.
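To make the penalization setting concrete, here is a hedged, minimal sketch of data-driven penalized model selection over nested models. It is an illustration only, not the paper's procedure: it uses a global (non-localized) Rademacher penalty rather than the local Rademacher complexities discussed in the abstract, and the toy distribution, model family (histogram classifiers on nested partitions), and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D binary classification problem (hypothetical): the regression
# function eta(x) = P(Y=1|X=x) crosses 1/2 at x = 0.5.
n = 500
X = rng.uniform(0, 1, n)
Y = (rng.uniform(0, 1, n) < np.clip(0.5 + 2.0 * (X - 0.5), 0, 1)).astype(int)

def empirical_risk(pred, Y):
    # Fraction of misclassified points (0-1 loss).
    return np.mean(pred != Y)

def fit_histogram(X, Y, m):
    # ERM over the model S_m: classifiers constant on m equal-width bins.
    # The models S_1 <= S_2 <= S_4 <= ... are nested for dyadic m.
    bins = np.minimum((X * m).astype(int), m - 1)
    labels = np.array([np.round(Y[bins == j].mean()) if np.any(bins == j) else 0.0
                       for j in range(m)])
    return labels[bins]

def rademacher_penalty(X, m, n_rounds=50):
    # Data-dependent penalty: a Monte-Carlo estimate of the *global*
    # Rademacher complexity of S_m (a simplification of the localized
    # penalties in the paper).  For histogram models the supremum of
    # (1/n) sum_i sigma_i f(X_i) is attained bin by bin.
    bins = np.minimum((X * m).astype(int), m - 1)
    total = 0.0
    for _ in range(n_rounds):
        sigma = rng.choice([-1.0, 1.0], size=len(X))
        total += sum(abs(sigma[bins == j].sum()) for j in range(m)) / len(X)
    return total / n_rounds

# Select m_hat by minimizing empirical risk plus the data-driven penalty.
models = [1, 2, 4, 8, 16, 32, 64]
crit = [empirical_risk(fit_histogram(X, Y, m), Y) + rademacher_penalty(X, m)
        for m in models]
m_hat = models[int(np.argmin(crit))]
```

The penalty grows roughly like sqrt(m/n), so the criterion trades the decreasing training error of large models against their complexity; on this toy problem a small model near the Bayes partition is selected.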


Similar resources

Investigating and Analysing Instructional Design and Workplace Learning Models and Selection of Adaptive Model to Optimize Organizational Training in Petrochemical Industry

The present research aimed to analyze instructional design and workplace learning models and to select the optimum learning model for human resources training in the petrochemical industry. With the start of privatization and the change in the nature of the company from holding to a governance structure, previous roles have faded and new opportunities have appeared in the petrochemical industry, and develop...

Full text

Margin Adaptive Risk Bounds for Classification Trees

Margin adaptive risk bounds for Classification and Regression Trees (CART, Breiman et al. 1984) classifiers are obtained in the binary supervised classification framework. These risk bounds are obtained conditionally on the construction of the maximal deep binary tree and make it possible to prove that the linear penalty used in the CART pruning algorithm is valid under the margin condition. It is also show...

Full text

Enhancement of Power System Voltage Stability Using New Centralized Adaptive Load Shedding Method

This paper presents a new centralized adaptive underfrequency load shedding method. Sometimes, after the initial frequency drop following a severe disturbance, the system frequency returns to its permissible value, yet the system might still become unstable due to voltage problems. In this regard, the paper proposes a new centralized adaptive load shedding method to enhance the voltage stab...

Full text

Model Selection for Multi-class SVMs

In the framework of statistical learning, fitting a model to a given problem is usually done in two steps. First, model selection is performed to set the values of the hyperparameters. Second, training selects, for this set of values, a function that performs satisfactorily on the problem. Choosing the values of the hyperparameters remains a difficult task, which has only been...

Full text

Soft Margin Bayes-Point-Machine Classification via Adaptive Direction Sampling

Supervised machine learning is an important building block for many applications that involve data processing and decision making. Good classifiers are trained to produce accurate predictions on a training set while also generalizing well to unseen data. To this end, Bayes Point Machines (BPM) were proposed in the past as a generalization of margin-maximizing classifiers, such as Support-Vector-...

Full text



Publication date: 2008